Differential privacy (DP) is in our smartphones, web browsers, social media, and the federal statistics used to allocate billions of dollars. Despite the mathematical concept being only 17 years old, differential privacy has amassed a rapidly growing list of real‐world applications, such as data releases by Meta and the US Census Bureau. Why is DP so pervasive? DP is currently the only mathematical framework that provides a finite and quantifiable bound on disclosure risk when releasing information from confidential data. Earlier concepts of data privacy and confidentiality required various assumptions about how a bad actor might attack sensitive data. DP is often described as formally private because statisticians can mathematically prove the worst‐case privacy loss that could result from releasing information based on the confidential data. Although DP ushered in a new era of data privacy and confidentiality methodologies, many researchers and data practitioners criticize differentially private frameworks. In this paper, we provide readers a critical overview of the current state‐of‐the‐art research on formal privacy methodologies and various relevant perspectives, challenges, and opportunities.
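The quantifiable worst‐case bound mentioned above can be made concrete with the classic Laplace mechanism, a standard DP construction (this sketch is illustrative and not drawn from the paper itself; the function name and parameter values are hypothetical):

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release a noisy statistic satisfying epsilon-differential privacy.

    Adding Laplace(0, sensitivity / epsilon) noise bounds the worst-case
    privacy loss of the release by epsilon, regardless of how an attacker
    later analyzes the output.
    """
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release a count query. A single person can change a
# count by at most 1, so the sensitivity is 1; epsilon = 1 is a common
# illustrative privacy-loss budget.
rng = np.random.default_rng(0)
true_count = 1000
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=1.0, rng=rng)
```

Smaller values of `epsilon` yield noisier releases and a tighter bound on disclosure risk; this tunable trade-off between privacy and accuracy is central to the debates the paper surveys.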
This article is categorized under: Applications of Computational Statistics > Defense and National Security